Tempering Backpropagation Networks: Not All Weights are Created Equal
Authors
Nicol N. Schraudolph, Terrence J. Sejnowski
Abstract
Backpropagation learning algorithms typically collapse the network’s structure into a single vector of weight parameters to be optimized. We suggest that their performance may be improved by utilizing the structural information instead of discarding it, and introduce a framework for “tempering” each weight accordingly. In the tempering model, activation and error signals are treated as approximately independent random variables. The characteristic scale of weight changes is then matched to that of the residuals, allowing structural properties such as a node’s fan-in and fan-out to affect the local learning rate and backpropagated error. The model also permits calculation of an upper bound on the global learning rate for batch updates, which in turn leads to different update rules for bias vs. non-bias weights. This approach yields hitherto unparalleled performance on the family relations benchmark, a deep multi-layer network: for both batch learning with momentum and the delta-bar-delta algorithm, convergence at the optimal learning rate is sped up by more than an order of magnitude.
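The abstract packs several mechanisms into a few sentences. As a rough illustration of the central one, the sketch below scales each layer's weight step by the reciprocal of its fan-in while giving bias weights a separate, untempered step. This is a minimal sketch under assumed rules: the layer sizes, the global rate eta0, and the exact 1/fan-in scaling are illustrative choices, not the paper's derived update equations.

```python
# Minimal sketch of fan-in "tempering" in a small deep network. The 1/fan_in
# scaling for non-bias weights and the untempered bias step are assumptions
# standing in for the paper's derived rules; sizes and eta0 are arbitrary.
import numpy as np

rng = np.random.default_rng(0)
sizes = [4, 8, 8, 1]                                # small deep net, arbitrary
Ws = [rng.normal(0, 1 / np.sqrt(m), (m, n))
      for m, n in zip(sizes[:-1], sizes[1:])]
bs = [np.zeros(n) for n in sizes[1:]]
eta0 = 0.5                                          # global learning rate

def tempered_step(x, y):
    acts = [x]                                      # forward pass, tanh units
    for W, b in zip(Ws, bs):
        acts.append(np.tanh(acts[-1] @ W + b))
    delta = (acts[-1] - y) * (1 - acts[-1] ** 2)    # output error signal
    for i in reversed(range(len(Ws))):
        grad_W = np.outer(acts[i], delta)
        grad_b = delta.copy()
        if i > 0:                                   # backpropagate the error
            delta = (delta @ Ws[i].T) * (1 - acts[i] ** 2)
        fan_in = Ws[i].shape[0]
        Ws[i] -= (eta0 / fan_in) * grad_W           # tempered non-bias update
        bs[i] -= eta0 * grad_b                      # separate rule for biases

x, y = rng.normal(size=4), np.array([0.5])
for _ in range(100):
    tempered_step(x, y)
```

The point of the fan-in division is that a node summing many inputs sees larger aggregate weight changes for the same per-weight step, so tempering keeps the characteristic scale of the change comparable across nodes of different sizes.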
Similar Articles
Backpropagation of Hebbian plasticity for lifelong learning
Hebbian plasticity allows biological agents to learn from their lifetime experience, extending the fixed information provided by evolutionary search. Conversely, backpropagation methods can build high-performance fixed-weights networks, but are not currently equipped to design networks with Hebbian connections. Here we use backpropagation to train fully-differentiable plastic networks, such tha...
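To make the idea concrete, here is a small sketch of one common form of trainable Hebbian plasticity: each synapse combines a fixed weight with a plasticity coefficient that gates a running Hebbian trace. In the approach described above, both parameters would themselves be trained by backpropagation; the sketch shows only the lifetime (forward) dynamics, and every name and constant in it is an illustrative assumption.

```python
# Sketch of a "plastic" layer: effective weight = fixed part + gated Hebbian
# trace. In the paper both w and alpha are learned by backpropagation; here we
# only demonstrate the within-lifetime dynamics. Names/constants are assumed.
import numpy as np

rng = np.random.default_rng(1)
n_in, n_out = 5, 3
w = rng.normal(0, 0.1, (n_in, n_out))       # fixed component (learned offline)
alpha = rng.normal(0, 0.1, (n_in, n_out))   # plasticity gain (also learned)
hebb = np.zeros((n_in, n_out))              # Hebbian trace, updated at runtime
trace_decay = 0.1                           # assumed decay rate of the trace

def plastic_forward(x, hebb):
    y = np.tanh(x @ (w + alpha * hebb))     # plastic part added to fixed weight
    hebb = (1 - trace_decay) * hebb + trace_decay * np.outer(x, y)
    return y, hebb

x = rng.normal(size=n_in)
y, hebb = plastic_forward(x, hebb)
```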
Neural Network Model of the Backpropagation Algorithm
We apply a neural network to model the neural network learning algorithm itself. The process of weight updating in the neural network is observed and stored to file. Later, this data is used to train another network, which will then be able to train neural networks by imitating the trained algorithm. We use the backpropagation algorithm both for training and for sampling the training process. We i...
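The pipeline this abstract describes, logging the updates made by one training run and fitting a second model to reproduce them, can be illustrated in a few lines. The features handed to the imitator (current weights, input, error) and the use of a linear least-squares imitator are stand-in assumptions, not the paper's architecture.

```python
# Rough sketch of imitating a learning algorithm: record each backprop update
# on a toy task, then fit a model mapping local observations to updates.
import numpy as np

rng = np.random.default_rng(2)
log = []                                    # stands in for the file of updates

# A single linear neuron trained by backprop while we record its updates.
w, eta = rng.normal(size=3), 0.1
for _ in range(200):
    x = rng.normal(size=3)
    err = (w @ x) - np.sin(x.sum())         # arbitrary regression target
    dw = -eta * err * x                     # the update we want to imitate
    log.append((np.concatenate([w, x, [err]]), dw))
    w += dw

# "Another network": here just least squares, imitating the logged update map.
X = np.array([f for f, _ in log])
Y = np.array([d for _, d in log])
M, *_ = np.linalg.lstsq(X, Y, rcond=None)
predicted_updates = X @ M                   # would drive training of a new net
```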
An Approximation of the Error Backpropagation Algorithm in a Predictive Coding Network with Local Hebbian Synaptic Plasticity
To efficiently learn from feedback, cortical networks need to update synaptic weights on multiple levels of cortical hierarchy. An effective and well-known algorithm for computing such changes in synaptic weights is the error backpropagation algorithm. However, in this algorithm, the change in synaptic weights is a complex function of weights and activities of neurons not directly connected wit...
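A compact way to see the locality claim is a small predictive coding sketch: value nodes relax to reduce per-layer prediction errors, after which each weight changes as a product of presynaptic activity and the error at its own layer, with no reference to distant weights. The sizes, relaxation rate, and learning rate below are illustrative, and the update follows the generic predictive coding recipe rather than this paper's exact network.

```python
# Two-layer predictive coding sketch with only local Hebbian weight updates.
# e_l is the prediction error local to layer l; x1 is the hidden value node.
import numpy as np

rng = np.random.default_rng(3)
W1 = rng.normal(0, 0.3, (4, 6))                     # input -> hidden
W2 = rng.normal(0, 0.3, (6, 2))                     # hidden -> output

def pc_step(x0, target, relax=0.2, eta=0.05, n_relax=30):
    global W1, W2
    x1 = x0 @ W1                                    # feedforward initialization
    for _ in range(n_relax):                        # inference: relax x1
        e1 = x1 - x0 @ W1                           # error local to hidden layer
        e2 = target - np.tanh(x1) @ W2              # error at clamped output
        x1 += relax * (-e1 + (1 - np.tanh(x1) ** 2) * (e2 @ W2.T))
    e1 = x1 - x0 @ W1                               # final local errors
    e2 = target - np.tanh(x1) @ W2
    W1 += eta * np.outer(x0, e1)                    # Hebbian: pre-activity x error
    W2 += eta * np.outer(np.tanh(x1), e2)

pc_step(rng.normal(size=4), np.array([1.0, -1.0]))
```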
Interval type-2 fuzzy weight adjustment for backpropagation neural networks with application in time series prediction
In this paper a new backpropagation learning method enhanced with type-2 fuzzy logic is presented. Simulation results and a comparative study among monolithic neural networks, neural networks with type-1 fuzzy weights, and neural networks with type-2 fuzzy weights are presented to illustrate the advantages of the proposed method. In this work, type-2 fuzzy inference systems are used to obtain the ...
Statistical efficiency of adaptive algorithms
The statistical efficiency of a learning algorithm applied to the adaptation of a given set of variable weights is defined as the ratio of the quality of the converged solution to the amount of data used in training the weights. Statistical efficiency is computed by averaging over an ensemble of learning experiences. A high quality solution is very close to optimal, while a low quality solution...
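The definition invites a toy computation: train the same adaptive filter on a fixed amount of data many times, average a quality score for the converged solutions over the ensemble, and divide by the amount of data used. The LMS filter and the particular quality score below are invented stand-ins; the abstract does not prescribe either.

```python
# Toy ensemble estimate of "statistical efficiency" for an LMS adaptive filter.
# The quality score 1/(1 + excess squared error) is an invented stand-in.
import numpy as np

rng = np.random.default_rng(4)
w_opt = np.array([1.0, -2.0, 0.5])                  # unknown optimal weights

def run_lms(n_samples, mu=0.05):
    w = np.zeros(3)
    for _ in range(n_samples):
        x = rng.normal(size=3)
        d = w_opt @ x + 0.1 * rng.normal()          # noisy desired response
        w += mu * (d - w @ x) * x                   # LMS weight update
    return w

n = 500                                             # amount of training data
quality = np.mean([1.0 / (1.0 + np.sum((run_lms(n) - w_opt) ** 2))
                   for _ in range(20)])             # average over an ensemble
efficiency = quality / n                            # quality per data point
print(efficiency)
```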